Module 9 - Ensemble Models

Overview

We continue our study of tree-based methods by learning about techniques that combine multiple trees into an ensemble to improve predictive performance. This family includes bagging, random forests, boosting, and Bayesian additive regression trees (BART). Ensemble methods turn trees into some of the most powerful and widely applicable models in machine learning.

Lab 5 is due at the end of the week.

Learning Objectives

  • Understand bagging, boosting, random forests, and BART
  • Understand the hyperparameter choices involved in fitting these models
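As a concrete illustration of the hyperparameter choices above, here is a minimal sketch using scikit-learn (an assumption on tooling; the synthetic data and parameter values are illustrative, not from the course). It fits a random forest and a boosted ensemble, highlighting the main tuning knobs for each.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Illustrative synthetic data: 5 predictors, a nonlinear signal, small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + 2 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Random forest: n_estimators is the number of trees; max_features is the
# number of predictors considered at each split (the key choice that
# decorrelates the trees; bagging is the special case max_features = all)
rf = RandomForestRegressor(n_estimators=200, max_features=2, random_state=0)
rf.fit(X_train, y_train)

# Boosting: learning_rate (shrinkage) and max_depth (tree size) are the main
# hyperparameters alongside the number of trees; small trees fit slowly,
# so boosting typically needs more of them at lower learning rates
gb = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                               max_depth=2, random_state=0)
gb.fit(X_train, y_train)

print("forest R^2:", round(rf.score(X_test, y_test), 3))
print("boosting R^2:", round(gb.score(X_test, y_test), 3))
```

Both models should recover most of the signal here; in practice these hyperparameters are chosen by cross-validation rather than fixed by hand.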

Readings

  • ISLP (Introduction to Statistical Learning): 8.2

Videos